
    Inverse Ontomimetic Simulation: a window on complex systems

    The present paper introduces "ontomimetic simulation" and argues that this class of models has enabled the investigation of hypotheses about complex systems in new ways that have epistemological relevance. Ontomimetic simulation can be differentiated from other types of modeling by its reliance on causal similarity in addition to representation. Phenomena are modeled not directly but via mimesis of the ontology (i.e. the "underlying physics", micro-level, etc.) of systems and a subsequent animation of the resulting model ontology as a dynamical system. While the ontology is clearly used for computing system states, what is epistemologically important is that it is viewed as a hypothesis about the makeup of the studied system. This type of simulation, where model ontologies are used as hypotheses, is here called inverse ontomimetic simulation since it reverses the typical informational path from the target to the model system. It links experimental and analytical techniques in being explicitly dynamical while at the same time capable of abstraction. Inverse ontomimetic simulation is argued to have a great impact on science and to be the tool for hypothesis-testing that has made systematic theory development for complex systems possible.
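    As a toy illustration of the idea (ours, not the paper's): posit micro-level rules as the hypothesized ontology, animate them as a dynamical system, and compare the emergent macro-level behavior against observations of the target system. The specific rule and all names below are invented for illustration.

```python
# Minimal sketch of (inverse) ontomimetic simulation: the micro-level
# update rule below plays the role of the *hypothesized ontology*;
# animating it yields macro-level behavior that can be compared with
# the target system. The rule itself is illustrative, not the paper's.
import random

def step(state):
    """One synchronous update: each cell adopts its local majority."""
    n = len(state)
    return [
        1 if state[(i - 1) % n] + state[i] + state[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]

def simulate(n_cells=100, n_steps=50, seed=0):
    random.seed(seed)
    state = [random.randint(0, 1) for _ in range(n_cells)]
    for _ in range(n_steps):
        state = step(state)
    return sum(state) / n_cells  # a macro-level observable

# If the simulated observable matches measurements of the target system,
# the micro-level rule gains support as a hypothesis about its makeup.
print(simulate())
```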

    Alcohol Use and Stress in University Freshmen - A Comparative Intervention Study of Two Universities

    Starting university is associated with major academic, personal and social opportunities. For many people, university entrance is also associated with increased stress and alcohol consumption. At the start of the autumn term 2002, all students entering educational programmes at two comparable middle-sized Swedish universities were invited to participate in a comparative intervention study. This included both primary and secondary interventions targeting hazardous drinking and stress. The overall aim was to improve alcohol habits and stress patterns in university freshmen at an intervention university in comparison with a control university. A total of 2,032 (72%) freshmen responded to the baseline assessment. Half of them scored above traditional AUDIT cut-off levels for hazardous alcohol use. Factors associated with hazardous use were age below 26, male gender, family history of alcohol problems, and not being in a serious relationship. The Arnetz and Hasson Stress Questionnaire was evaluated and used to study a selection of freshmen at high risk of stress. It was easy to use and offered sufficient internal consistency and construct validity. In the freshman year, 517 students (25%) dropped out of university education. A multivariate analysis established that high stress and university setting were associated with dropout from university studies, while symptoms of depression and anxiety as well as hazardous drinking were not. Outcomes were analysed in students remaining at university at one-year follow-up. The primary interventions offered to freshmen at the intervention university reduced alcohol expectancies and mental symptoms compared with freshmen at the control university. Secondary stress interventions were effective in reducing mental symptoms and alcohol expectancies. Secondary alcohol interventions were effective in reducing AUDIT scores, alcohol expectancies, estimated blood alcohol concentrations, as well as stress and mental symptoms. In conclusion, both primary and secondary alcohol and stress interventions have one-year effects in university freshmen and could be used in university settings.
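    The abstract mentions "traditional AUDIT cut-off levels" without stating them; a common convention in Swedish student studies, assumed here purely for illustration, flags scores of 8 or more for men and 6 or more for women as hazardous. A minimal sketch of such a screening step:

```python
# Hypothetical illustration of AUDIT-based screening. The cut-offs
# (>= 8 for men, >= 6 for women) are an assumed convention; the
# abstract only says "traditional AUDIT cut-off levels".
def hazardous(audit_score, gender):
    cutoff = 8 if gender == "male" else 6
    return audit_score >= cutoff

freshmen = [("male", 11), ("female", 5), ("female", 7), ("male", 6)]
flagged = [score for gender, score in freshmen if hazardous(score, gender)]
print(f"{len(flagged)}/{len(freshmen)} above cut-off")  # 2/4
```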

    A complex network approach to urban growth

    Economic geography can be viewed as a large and growing network of interacting activities. This fundamental network structure and the large size of such systems make complex networks an attractive model for their analysis. In this paper we propose the use of complex networks for geographical modeling and demonstrate how such an application can be combined with a cellular model to produce output that is consistent with large-scale regularities such as power laws and fractality. Complex networks can provide a stringent framework for growth-dynamic modeling where concepts from, e.g., spatial interaction models and multiplicative growth models can be combined with the flexible representation of land and behavior found in cellular automata and agent-based models. In addition, there exists a large body of theory for the analysis of complex networks that has direct applications to urban geographic problems. The intended use of such models is twofold: i) to address the problem of how the empirically observed hierarchical structure of settlements can be explained as a stationary property of a stochastic evolutionary process rather than as equilibrium points of a dynamical system, and ii) to improve the prediction quality of applied urban modeling.
    Keywords: evolutionary economics, complex networks, urban growth
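    A minimal sketch (ours, not the paper's model) of one mechanism in this family: settlements that grow multiplicatively, with new activity attaching in proportion to current size, reproduce the heavy-tailed, power-law-like size distributions cited above as a target regularity.

```python
# Illustrative sketch: settlements grow by size-proportional
# (multiplicative / preferential) attachment, a simple mechanism known
# to yield heavy-tailed size distributions. Parameters are invented.
import random

def grow(n_units=10000, n_settlements=50, seed=1):
    random.seed(seed)
    sizes = [1] * n_settlements  # every settlement starts with one unit
    total = n_settlements
    for _ in range(n_units):
        # each new unit of activity joins a settlement with probability
        # proportional to its current size (multiplicative growth)
        r = random.uniform(0, total)
        acc = 0
        for i, s in enumerate(sizes):
            acc += s
            if r <= acc:
                sizes[i] += 1
                break
        total += 1
    return sorted(sizes, reverse=True)

sizes = grow()
print(sizes[:5])  # a few dominant settlements, many small ones
```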

    On the Property Rights System of the State Enterprises in China

    Detailed analysis of spinal deformity is important within orthopaedic healthcare, in particular for assessment of idiopathic scoliosis. This paper addresses the challenge by proposing an image analysis method capable of providing a full three-dimensional characterization of the spine. The proposed method is based on the registration of a highly detailed spine model to image data from computed tomography. The registration process provides an accurate segmentation of each individual vertebra and the ability to derive various measures describing the spinal deformity. The derived measures are estimated from landmarks attached to the spine model and transferred to the patient data according to the registration result. Evaluation of the method provides an average point-to-surface error of 0.9 mm ± 0.9 (comparing segmentations) and an average target registration error of 2.3 mm ± 1.7 (comparing landmarks). Comparing automatic and manual measurements of axial vertebral rotation provides a mean absolute difference of 2.5° ± 1.8, which is on a par with other computerized methods for assessing axial vertebral rotation. A significant advantage of our method, compared with other computerized methods for rotational measurements, is that it does not rely on vertebral symmetry for computing the rotational measures. The proposed method is fully automatic and computationally efficient, requiring only three to four minutes to process an entire image volume covering vertebrae L5 to T1. Given the use of landmarks, the method can be readily adapted to estimate other measures describing a spinal deformity by changing the set of employed landmarks. In addition, the method has the potential to be utilized for accurate segmentation of the vertebrae in routine computed tomography examinations, given the relatively low point-to-surface error.
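    The evaluation reports two standard registration metrics: point-to-surface error and target registration error. A minimal sketch of how such metrics are typically computed from a registered model and paired landmarks (ours and deliberately simplified, with the "surface" reduced to a point cloud; not the paper's code):

```python
# Illustrative computation of the two evaluation metrics above:
# point-to-surface error (segmentation agreement) and target
# registration error (landmark agreement), both in mm.
import math

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def point_to_surface_error(points, surface):
    """Mean distance from each point to its nearest surface sample."""
    return sum(min(dist(p, s) for s in surface) for p in points) / len(points)

def target_registration_error(moved_landmarks, true_landmarks):
    """Mean distance between registered and ground-truth landmarks."""
    pairs = zip(moved_landmarks, true_landmarks)
    return sum(dist(a, b) for a, b in pairs) / len(moved_landmarks)

surface = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
points = [(0.1, 0.0, 0.0), (1.0, 0.2, 0.0)]
print(point_to_surface_error(points, surface))        # 0.15
print(target_registration_error([(0, 0, 1)], [(0, 0, 0)]))  # 1.0
```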

    Cultural complexity and complexity evolution

    We review issues stemming from current models regarding the drivers of cultural complexity and cultural evolution. We disagree with the implication of the treadmill model, based on dual-inheritance theory, that population size is the driver of cultural complexity. The treadmill model reduces the evolution of artifact complexity, measured by the number of parts, to the statistical fact that individuals with high skills are more likely to be found in a larger population than in a smaller one. However, for the treadmill model to operate as claimed, implausibly high skill levels must be assumed. Contrary to the treadmill model, the risk hypothesis for the complexity of artifacts relates the number of parts to increased functional efficiency of implements. Empirically, all data on hunter-gatherer artifact complexity support the risk hypothesis and reject the treadmill model. Still, there are conditions under which increased technological complexity relates to increased population size, but the dependency does not occur in the manner expressed in the treadmill model. Instead, complexity relates to population size when the support system for the technology requires a large population. If anything, anthropology and ecology suggest that cultural complexity generates high population density rather than the other way around.
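    The statistical claim at the heart of the treadmill model (larger populations are likelier to contain an unusually skilled individual, so imperfect copying of the best model still ratchets mean skill upward) can be made concrete with a toy simulation. This sketch is our illustration of the argument under criticism; the error distribution and all parameters are invented.

```python
# Toy sketch of the "treadmill" argument the review criticizes: each
# generation everyone copies the most skilled individual, imperfectly.
# Larger populations more often contain a lucky, highly skilled copier,
# so skill accumulates faster with population size.
import random

def run(pop_size, generations=50, loss=1.0, dispersion=1.2, seed=2):
    random.seed(seed)
    best = 0.0
    for _ in range(generations):
        # each learner copies `best` with systematic loss and random error
        skills = [best - loss + random.gauss(0, dispersion)
                  for _ in range(pop_size)]
        best = max(skills)
    return best

for n in (10, 100, 1000):
    print(n, round(run(n), 2))  # final skill grows with population size
```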

    Zooming out the microscope on cumulative cultural evolution: ‘Trajectory B’ from animal to human culture

    It is widely believed that human culture originated in the appearance of Oldowan stone-tool production (circa 2.9 Mya) and a primitive but effective ability to copy detailed know-how. Cumulative cultural evolution is then believed to have led to modern humans and human culture via self-reinforcing gene-culture co-evolution. This outline of an evolutionary trajectory has come to be seen as all but self-evident, but dilemmas have appeared as it has been explored in increasing detail. Can we attribute even a minimally effective know-how copying capability to Oldowan hominins? Do Oldowan tools really demand know-how copying? Is there any other evidence that know-how copying was present? We argue here that this account, which we refer to as "Trajectory A", may be a red herring, and formulate an alternative "Trajectory B" that resolves these dilemmas. Trajectory B invokes an overlooked group-level channel of cultural inheritance (the Social Protocell) whereby networks of cultural traits can be faithfully inherited and potentially undergo cumulative evolution, even when the underpinning cultural traits are apelike in not being transmitted via know-how copying (Latent Solutions). Since most preconditions of Trajectory B are present in modern-day Pan, Trajectory B may have its roots considerably before Oldowan toolmaking. The cumulative build-up of networks of non-cumulative cultural traits is then argued to have produced conditions that both called for and afforded a gradual appearance of the ability to copy know-how, but considerably later than the Oldowan.

    A strand-specific high-resolution normalization method for ChIP-sequencing data employing multiple experimental control measurements

    Background: High-throughput sequencing is becoming the standard tool for investigating protein-DNA interactions and epigenetic modifications. However, the data generated will always contain noise due to, e.g., repetitive regions or non-specific antibody interactions. The noise will appear in the form of a background distribution of reads that must be taken into account in the downstream analysis, for example when detecting enriched regions (peak-calling). Several reported peak-callers can take experimental measurements of the background tag distribution into account when analysing a data set. Unfortunately, the background is only used to adjust peak calling and not as a preprocessing step that aims at discerning the signal from the background noise. A normalization procedure that extracts the signal of interest would be of universal use when investigating genomic patterns.
    Results: We formulated such a normalization method based on linear regression and made a proof-of-concept implementation in R and C++. It was tested on simulated data as well as on publicly available ChIP-seq data on binding sites for two transcription factors, MAX and FOXA1, and two control samples, Input and IgG. We applied three different peak-callers to (i) raw (un-normalized) data using statistical background models, (ii) raw data with control samples as background, and (iii) normalized data without additional control samples as background. The fraction of called regions containing the expected transcription factor binding motif was largest for the normalized data, and evaluation with qPCR data for FOXA1 suggested higher sensitivity and specificity using normalized data over raw data with experimental background.
    Conclusions: The proposed method can handle several control samples, allowing for correction of multiple sources of bias simultaneously. Our evaluation on both synthetic and experimental data suggests that the method is successful in removing background noise.
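    The described normalization regresses the ChIP signal on one or more control tracks and removes the fitted background as a preprocessing step. A minimal sketch of that idea follows; the authors' proof-of-concept is in R and C++, so this Python re-expression is ours and purely illustrative, as is the simulated data.

```python
# Minimal sketch of regression-based background normalization as
# described above: regress ChIP read counts on several control tracks
# (e.g. Input and IgG) and keep the positive residual as the
# background-corrected signal. Illustrative, not the authors' code.
import numpy as np

def normalize(chip, controls):
    """chip: (n_bins,); controls: (n_bins, n_controls). Residual signal."""
    X = np.column_stack([np.ones(len(chip)), controls])  # intercept + controls
    coef, *_ = np.linalg.lstsq(X, chip, rcond=None)      # least-squares fit
    background = X @ coef
    return np.clip(chip - background, 0, None)           # signal above background

rng = np.random.default_rng(0)
input_track = rng.poisson(10, 1000).astype(float)
igg_track = rng.poisson(8, 1000).astype(float)
chip = 0.7 * input_track + 0.4 * igg_track + rng.poisson(2, 1000)
chip[500:510] += 50                                      # one true enriched region
signal = normalize(chip, np.column_stack([input_track, igg_track]))
print(signal[500:510].mean(), signal[:100].mean())       # peak stands out
```

    Handling the controls as multiple regressors is what allows several sources of bias to be corrected simultaneously, matching the conclusion above.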